AAAI AI-Alert for May 17, 2022
Predicting Others' Behavior on the Road With Artificial Intelligence
Researchers have created a machine-learning system that efficiently predicts the future trajectories of multiple road users, such as drivers, cyclists, and pedestrians, which could enable an autonomous vehicle to navigate city streets more safely. Humans may be one of the biggest roadblocks to fully autonomous vehicles operating on city streets: if a robot is going to navigate a vehicle safely through downtown Boston, it must be able to predict what nearby drivers, cyclists, and pedestrians are going to do next. The new system may someday help driverless cars make those predictions in real time.
Why Some Instagram And Facebook Filters Can't Be Used In Texas After Lawsuit
Instagram and Facebook users in Texas lost access to certain augmented reality filters Wednesday, following a lawsuit accusing parent company Meta of violating privacy laws. In February, Texas Attorney General Ken Paxton revealed he would sue Meta for using facial recognition in filters to collect data for commercial purposes without consent. Paxton claimed Meta was "storing millions of biometric identifiers" that included voiceprints, retina or iris scans, and hand and face geometry. Although Meta argued it does not use facial recognition technology, it has disabled its AR filters and avatars on Facebook and Instagram amid the litigation. The AR effects featured on Facebook, Messenger, Messenger Kids, and Portal will also be shut down for Texas users.
Autonomous Vehicle with 2D Lidar
Lidar is an acronym for light detection and ranging. Lidar is like radar, except that it uses light instead of radio waves. The light source is a laser. A lidar sends out light pulses and measures the time it takes for a reflection bouncing off a remote object to return to the device. As the speed of light is a known constant, the distance to the object can be calculated from the travel time of the light pulse: because the pulse travels to the object and back, the one-way distance is half the product of the speed of light and the measured round-trip time (Figure 1).
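The time-of-flight calculation described above can be sketched in a few lines of Python (the function name and example timing are illustrative, not from the article):

```python
# Minimal sketch of lidar time-of-flight ranging.
C = 299_792_458.0  # speed of light in metres per second

def distance_from_pulse(round_trip_time_s: float) -> float:
    """Return the one-way distance to the reflecting object in metres.

    The pulse travels out to the object and back, so the one-way
    distance is half of (speed of light x round-trip time).
    """
    return C * round_trip_time_s / 2.0

# Example: a reflection returning after about 66.7 nanoseconds
# corresponds to an object roughly 10 metres away.
d = distance_from_pulse(66.7e-9)
```

In practice a 2D lidar repeats this measurement while sweeping the beam through a plane, yielding a polar map of distances around the vehicle.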
AI Can Now Understand Your Videos by Watching Them
A new artificial intelligence (AI) system could watch and listen to your videos and label things that are happening. MIT researchers have developed a technique that teaches AI to capture actions shared between video and audio. For example, their method can understand that the act of a baby crying in a video is related to the spoken word "crying" in a sound clip. It's part of an effort to teach AI how to understand concepts that humans have no trouble learning, but that computers find hard to grasp. "The prevalent learning paradigm, supervised learning, works well when you have datasets that are well described and complete," AI expert Phil Winder told Lifewire in an email interview.
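One common way to relate two modalities, which the MIT work builds on in spirit (this sketch is illustrative, not their actual method), is to map video and audio into a shared embedding space and compare vectors by cosine similarity. All embeddings below are hypothetical:

```python
# Illustrative cross-modal matching: a video embedding should score
# higher against a semantically matching audio embedding than against
# an unrelated one. Embedding values here are made up for the example.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical outputs of separate video and audio encoders.
video_clip_crying = [0.9, 0.1, 0.2]
audio_word_crying = [0.85, 0.15, 0.25]
audio_word_engine = [0.1, 0.9, 0.3]

match = cosine_similarity(video_clip_crying, audio_word_crying)
non_match = cosine_similarity(video_clip_crying, audio_word_engine)
```

Training such encoders (e.g. with a contrastive objective) pushes matching video-audio pairs together and mismatched pairs apart, so `match` ends up larger than `non_match`.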
Seoul launches VR simulator to test autonomous driving
The Seoul Metropolitan Government (SMG) has announced it is building a pilot driving zone for autonomous cars. Forming part of the cooperative intelligent transport system (C-ITS) construction project, the virtual reality autonomous driving simulator will reflect road, traffic, and weather conditions by using digital twin technologies. According to SMG, expanding the virtual territory to Gangnam and the city centre will enable Seoul to "leap forward" as a city of commercialised self-driving vehicles. The autonomous driving simulator will be open to the public, and anyone from companies to research institutes, start-ups, and universities can use it free of charge. SMG's rationale is that the more developers test on the simulator, the more opportunities there are to improve their technologies and help the industry advance.
AI-engineered enzyme eats entire plastic containers
A plastic-degrading enzyme enhanced by amino acid changes designed by a machine-learning algorithm can depolymerise polyethylene terephthalate (PET) at least twice as fast as, and at lower temperatures than, the next best engineered enzyme. Six years ago, scientists sifting through debris at a plastic bottle recycling plant discovered a bacterium that can degrade PET. The organism has two enzymes that hydrolyse the polymer first into mono-(2-hydroxyethyl) terephthalate and then into ethylene glycol and terephthalic acid to use as an energy source. One enzyme in particular, PETase, has become the target of protein engineering efforts to make it stable at higher temperatures and boost its catalytic activity. A team led by Hal Alper at the University of Texas at Austin in the US has created a PETase that can degrade 51 different PET products, including whole plastic containers and bottles.
A Smarter Way To Develop New Drugs Using Artificial Intelligence
MIT scientists have developed a machine-learning model that proposes new molecules for the drug discovery process while ensuring the molecules it suggests can actually be synthesized in a laboratory. Pharmaceutical companies are using artificial intelligence to streamline the process of discovering new medicines: machine-learning models can propose new molecules with specific properties that could fight certain diseases, accomplishing in minutes what might take humans months to achieve manually. But a major hurdle holds these systems back: the models frequently suggest new molecular structures that are difficult or impossible to produce in a laboratory.